online convex optimization



Fast Rates in Stochastic Online Convex Optimization by Exploiting the Curvature of Feasible Sets

Neural Information Processing Systems

In this work, we explore online convex optimization (OCO) and introduce a new condition and analysis that yield fast rates by exploiting the curvature of feasible sets. In online linear optimization, it is known that if the average gradient of the loss functions exceeds a certain threshold, the follow-the-leader (FTL) algorithm can exploit the curvature of the feasible set to achieve logarithmic regret. This study reveals that algorithms adaptive to the curvature of loss functions can also leverage the curvature of feasible sets. In particular, we first prove that if an optimal decision lies on the boundary of the feasible set and the gradient of the underlying loss function is non-zero, then the algorithm achieves a regret bound of O(ρ ln T) in stochastic environments. Here, ρ > 0 is the radius of the smallest sphere that includes the optimal decision and encloses the feasible set. Unlike existing approaches, ours works directly with convex loss functions, exploits the curvature of loss functions simultaneously, and achieves logarithmic regret using only a local property of the feasible set. Additionally, the algorithm achieves an O(√T) regret even in adversarial environments, in which FTL suffers an Ω(T) regret, and achieves an O(ρ ln T + √(Cρ ln T)) regret in corrupted stochastic environments with corruption level C.
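To make the FTL result referenced above concrete, here is a minimal sketch, not the paper's algorithm: follow-the-leader for online linear optimization over a Euclidean ball, the prototypical curved feasible set. The dimension, horizon, radius, and Gaussian noise model are illustrative assumptions.

```python
import numpy as np

# Sketch (illustrative, not the paper's adaptive algorithm): follow-the-leader
# (FTL) for online linear optimization over a Euclidean ball. Losses are
# f_t(x) = <g_t, x> with stochastic gradients g_t; on a ball, FTL plays the
# cumulative-gradient direction scaled to the boundary, -r * G / ||G||.

rng = np.random.default_rng(0)
d, T, radius = 5, 10_000, 1.0             # assumed problem sizes
mu = rng.normal(size=d)                   # mean loss direction (assumption)
mu /= np.linalg.norm(mu)

x = np.zeros(d)                           # FTL's first play (arbitrary)
G = np.zeros(d)                           # cumulative gradient
loss = 0.0
for t in range(T):
    g = mu + 0.1 * rng.normal(size=d)     # stochastic loss gradient
    loss += g @ x                         # learner's linear loss this round
    G += g
    x = -radius * G / np.linalg.norm(G)   # FTL: argmin_{||x|| <= r} <G, x>

# Best fixed decision in hindsight on the ball, and the resulting regret.
x_star = -radius * G / np.linalg.norm(G)
regret = loss - G @ x_star
print(f"FTL regret after T={T} rounds: {regret:.3f}")
```

The logarithmic behavior in this stochastic setting comes from the curvature of the ball; the abstract's point is that curvature-adaptive OCO algorithms can recover this rate without being FTL, while also avoiding FTL's linear regret in adversarial environments.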

Beyond Online Balanced Descent: An Optimal Algorithm for Smoothed Online Optimization

Gautam Goel, Yiheng Lin, Haoyuan Sun, Adam Wierman

Neural Information Processing Systems

We prove a new lower bound on the competitive ratio of any online algorithm in the setting where the costs are m-strongly convex and the movement costs are the squared ℓ2 norm. This lower bound shows that no algorithm can achieve a competitive ratio that is o(m^{-1/2}) as m tends to zero.
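The abstract is terse, so here is a minimal numerical sketch of the setting under illustrative assumptions: one-dimensional m-strongly convex hit costs f_t(x) = (m/2)(x − y_t)², squared ℓ2 movement costs, and a drifting minimizer sequence. It compares a naive per-round greedy minimizer against the offline optimum, whose first-order conditions form a tridiagonal linear system.

```python
import numpy as np

# Sketch of smoothed online optimization (illustrative assumptions throughout):
# hit costs f_t(x) = (m/2)(x - y_t)^2, movement costs (1/2)(x - x_prev)^2.
# A greedy learner minimizes each round's hit + movement cost; its total cost
# is compared with the offline optimum to probe the competitive ratio as m -> 0.

T = 200
y = np.arange(1.0, T + 1.0)               # drifting minimizers (assumed instance)

def online_cost(m):
    x_prev, total = 0.0, 0.0
    for t in range(T):
        # argmin_x (m/2)(x - y_t)^2 + (1/2)(x - x_prev)^2
        x = (m * y[t] + x_prev) / (m + 1.0)
        total += 0.5 * m * (x - y[t]) ** 2 + 0.5 * (x - x_prev) ** 2
        x_prev = x
    return total

def offline_cost(m):
    # First-order conditions of the total cost (with x_0 = 0) are tridiagonal.
    A = (np.diag(np.full(T, m + 2.0))
         + np.diag(np.full(T - 1, -1.0), 1)
         + np.diag(np.full(T - 1, -1.0), -1))
    A[-1, -1] = m + 1.0                   # last round has no outgoing movement
    x = np.linalg.solve(A, m * y)
    steps = np.diff(np.concatenate(([0.0], x)))
    return 0.5 * m * np.sum((x - y) ** 2) + 0.5 * np.sum(steps ** 2)

for m in (1.0, 0.1, 0.01):
    print(f"m={m:5.2f}  empirical competitive ratio ~ "
          f"{online_cost(m) / offline_cost(m):.2f}")
```

On this instance the greedy baseline's ratio grows roughly like 1/m, far worse than the o(m^{-1/2}) barrier; the lower bound says that even the best possible online algorithm cannot bring its competitive ratio below order m^{-1/2} as m → 0.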